50 research outputs found

    Neuromorphic Architecture Optimization for Task-Specific Dynamic Learning

    Full text link
    The ability to learn and adapt in real time is a central feature of biological systems. Neuromorphic architectures demonstrating such versatility can greatly enhance our ability to efficiently process information at the edge. A key challenge, however, is to understand which learning rules are best suited for specific tasks and how the relevant hyperparameters can be fine-tuned. In this work, we introduce a conceptual framework in which the learning process is integrated into the network itself. This allows us to cast meta-learning as a mathematical optimization problem. We employ DeepHyper, a scalable, asynchronous model-based search, to simultaneously optimize the choice of meta-learning rules and their hyperparameters. We demonstrate our approach with two different datasets, MNIST and FashionMNIST, using a network architecture inspired by the learning center of the insect brain. Our results show that optimal learning rules can be dataset-dependent even within similar tasks. This dependency demonstrates the importance of introducing versatility and flexibility in the learning algorithms. It also illuminates experimental findings in insect neuroscience that have shown a heterogeneity of learning rules within the insect mushroom body.
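    The abstract frames rule selection and hyperparameter tuning as a single search problem handed to DeepHyper. The sketch below is a minimal illustration of that kind of mixed categorical/continuous search using DeepHyper's hyperparameter-optimization interface (HpProblem, Evaluator, CBO, as in recent 0.x releases; module paths may differ across versions); the search space, the placeholder objective, and the evaluator settings are illustrative assumptions, not the paper's actual configuration.

```python
# Illustrative DeepHyper search over learning rules and their hyperparameters.
# The search space and the placeholder objective are assumptions, not the
# paper's actual mushroom-body training setup.
from deephyper.problem import HpProblem
from deephyper.evaluator import Evaluator
from deephyper.search.hps import CBO

problem = HpProblem()
problem.add_hyperparameter(["hebbian", "oja", "stdp"], "learning_rule")   # categorical rule choice
problem.add_hyperparameter((1e-4, 1e-1, "log-uniform"), "learning_rate")  # continuous, log scale
problem.add_hyperparameter((0.0, 1.0), "weight_decay")                    # continuous, uniform

def run(config):
    # Placeholder objective: in the paper this would train the insect-brain-inspired
    # network on MNIST/FashionMNIST with the chosen rule and return validation accuracy.
    score = -abs(config["learning_rate"] - 1e-2) - 0.1 * config["weight_decay"]
    return score

if __name__ == "__main__":
    evaluator = Evaluator.create(run, method="process", method_kwargs={"num_workers": 4})
    search = CBO(problem, evaluator)        # asynchronous model-based search
    results = search.search(max_evals=100)  # DataFrame of evaluated configurations
    print(results.sort_values("objective", ascending=False).head())
```

    Because the objective is maximized, the best configurations surface at the top of the results table; swapping the placeholder objective for a real training loop is the only change needed to reproduce the style of experiment the abstract describes.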

    A Domain-Agnostic Approach for Characterization of Lifelong Learning Systems

    Full text link
    Despite the advancement of machine learning techniques in recent years, state-of-the-art systems lack robustness to "real world" events, where the input distributions and tasks encountered by the deployed systems will not be limited to the original training context, and systems will instead need to adapt to novel distributions and tasks while deployed. This critical gap may be addressed through the development of "Lifelong Learning" systems that are capable of 1) Continuous Learning, 2) Transfer and Adaptation, and 3) Scalability. Unfortunately, efforts to improve these capabilities are typically treated as distinct areas of research that are assessed independently, without regard to the impact of each separate capability on other aspects of the system. We instead propose a holistic approach, using a suite of metrics and an evaluation framework to assess Lifelong Learning in a principled way that is agnostic to specific domains or system techniques. Through five case studies, we show that this suite of metrics can inform the development of varied and complex Lifelong Learning systems. We highlight how the proposed suite of metrics quantifies performance trade-offs present during Lifelong Learning system development - both the widely discussed Stability-Plasticity dilemma and the newly proposed relationship between Sample Efficient and Robust Learning. Further, we make recommendations for the formulation and use of metrics to guide the continuing development of Lifelong Learning systems and assess their progress in the future.
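    The metric suite itself is not spelled out in the abstract, but a toy example gives a sense of what a domain-agnostic lifelong-learning metric can look like. The sketch below computes a simple performance-maintenance (forgetting) score from a matrix of per-task performance recorded after each training stage; the metric definition and the numbers are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def performance_maintenance(R: np.ndarray) -> float:
    """Average drop from each task's best observed performance to its final
    performance. R[i, j] = performance on task j after training stage i.
    Illustrative metric only, not the paper's definition."""
    final = R[-1, :]                # performance on every task after the last stage
    best = R.max(axis=0)            # best performance ever reached on each task
    drops = best[:-1] - final[:-1]  # ignore the most recently learned task
    return float(drops.mean())

# Toy run: three tasks learned in sequence, with some forgetting of task 0.
R = np.array([
    [0.90, 0.10, 0.10],
    [0.70, 0.85, 0.12],
    [0.65, 0.80, 0.88],
])
print(performance_maintenance(R))  # 0.15: average forgetting over the first two tasks
```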

    Percentage of ALD papers containing a specific word pertaining to broad types of materials in their titles.

    No full text
    Percentage of ALD papers containing a specific word pertaining to broad types of materials in their titles.

    Characterizing the field of Atomic Layer Deposition: Authors, topics, and collaborations

    No full text
    This paper describes how Atomic Layer Deposition (ALD) has evolved over time using a combination of bibliometric, social network, and text analysis. We examined the rate of knowledge production as well as changes in authors, journals, and collaborators, showing a steady growth of ALD research. The study of the collaboration network of ALD scientists over time points out that the ALD research community is becoming larger and more interconnected, with a largest connected component that spans 90% of the authors in 2015. In addition, the evolution of network centrality measures (degree and betweenness centrality) and author productivity revealed the central figures in ALD over time, including new “stars” appearing in the last decade. Finally, the study of the title words in our dataset is consistent with a shift in focus on research topics towards energy applications and nanotechnology.
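    As a rough illustration of the network-analysis side of this study, the sketch below builds a small co-authorship graph with networkx and reports the quantities mentioned above: the share of authors in the largest connected component, plus degree and betweenness centralities. The toy edge list is made up; the paper's graph is built from the actual ALD author data.

```python
import networkx as nx

# Toy co-authorship graph: nodes are authors, edges mean "co-authored a paper".
# The edge list is illustrative; the study builds this graph from ALD publication data.
G = nx.Graph()
G.add_edges_from([
    ("Author A", "Author B"), ("Author A", "Author C"), ("Author B", "Author C"),
    ("Author D", "Author E"),  # a small disconnected collaboration
])

# Share of authors inside the largest connected component (90% in 2015 per the study).
largest_cc = max(nx.connected_components(G), key=len)
print("largest component share:", len(largest_cc) / G.number_of_nodes())

# Centrality measures used to identify central figures over time.
print("degree centrality:", nx.degree_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G))
```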

    Top 10 most productive authors in ALD.

    No full text
    Top 10 most productive authors in ALD.